Results 1 - 20 of 96
1.
Article in English | MEDLINE | ID: mdl-38522764

ABSTRACT

BACKGROUND: Identification of differences in mortality risk between female and male heart transplant recipients may prompt sex-specific management strategies. Because males of all ages worldwide have higher absolute mortality rates than females, we aimed to compare the excess risk of mortality (risk above the general population) in female versus male heart transplant recipients. METHODS: We fitted relative survival models separately in the SRTR and CTS cohorts (1988-2019) and combined the results using 2-stage individual patient data meta-analysis to compare the excess risk of mortality in female versus male first heart transplant recipients, accounting for the modifying effects of donor sex and recipient current age. RESULTS: We analyzed 108,918 patients. When the donor was male, female recipients aged 0-12 years (relative excess risk (RER) 1.13, 95% CI 1.00-1.26), 13-44 years (RER 1.17, 95% CI 1.10-1.25), and ≥45 years (RER 1.14, 95% CI 1.02-1.27) showed higher excess mortality risks than male recipients of the same age. When the donor was female, only female recipients aged 13-44 years showed higher excess risks of mortality than males, although this difference was not statistically significant (RER 1.09, 95% CI 1.00-1.20; p = 0.05). CONCLUSIONS: In the setting of a male donor, female recipients of all ages had significantly higher excess mortality than males. When the donor was female, female recipients of reproductive age had higher excess risks of mortality than male recipients of the same age, although this was not statistically significant. Further investigation is required to determine the reasons underlying these differences.
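The 2-stage approach described here (fitting models within each registry, then pooling) can be illustrated with a generic inverse-variance meta-analysis of the cohort-specific estimates. The Python sketch below is a minimal illustration, assuming fixed-effect pooling of log-transformed relative excess risks whose standard errors are reconstructed from reported 95% CIs; the study's actual meta-analytic model is not detailed in the abstract, and the input numbers are hypothetical.

```python
import numpy as np

def pool_fixed_effect(estimates, cis):
    """Inverse-variance (fixed-effect) pooling of cohort-specific relative
    excess risks (RERs) reported with 95% confidence intervals.
    Estimates and CIs are on the ratio scale; pooling is done on the log
    scale, as is usual for ratio measures."""
    log_est = np.log(estimates)
    # Back-calculate standard errors from the 95% CI width on the log scale
    log_se = (np.log([hi for _, hi in cis]) - np.log([lo for lo, _ in cis])) / (2 * 1.96)
    w = 1.0 / log_se**2                       # inverse-variance weights
    pooled = np.sum(w * log_est) / np.sum(w)  # weighted mean of log(RER)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
    return np.exp(pooled), (np.exp(lo), np.exp(hi))

# Hypothetical cohort-specific RERs for one age stratum (illustration only)
rer, ci = pool_fixed_effect(np.array([1.15, 1.20]), [(1.05, 1.26), (1.08, 1.33)])
print(f"pooled RER {rer:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")
```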

2.
Transplantation ; 2024 Jan 26.
Article in English | MEDLINE | ID: mdl-38277260

ABSTRACT

BACKGROUND: Kidney transplant recipients show sex differences in excess overall mortality risk that vary by donor sex and recipient age. However, whether the excess risk of death with graft function (DWGF) differs by recipient sex is unknown. METHODS: In this study, we combined data from 3 of the largest transplant registries worldwide (Scientific Registry of Transplant Recipients, Australia and New Zealand Dialysis and Transplant Registry, and Collaborative Transplant Study) using individual patient data meta-analysis to compare the excess risk of DWGF between male and female recipients of a first deceased donor kidney transplant (1988-2019), conditional on donor sex and recipient age. RESULTS: Among 463 895 individuals examined, when the donor was male, female recipients aged 0 to 12 y experienced a higher excess risk of DWGF than male recipients (relative excess risk 1.68; 95% confidence interval, 1.24-2.29); there were no significant differences in other age intervals or at any age when the donor was female. There was no statistically significant between-cohort heterogeneity. CONCLUSIONS: Given the lack of sex differences in the excess risk of DWGF (other than in prepubertal recipients of a male donor kidney) and the known greater excess overall mortality risk for female recipients compared with male recipients in the setting of a male donor, future study is required to characterize potential sex-specific causes of death after graft loss.

3.
Nephrol Dial Transplant ; 39(4): 607-617, 2024 Mar 27.
Article in English | MEDLINE | ID: mdl-37596063

ABSTRACT

BACKGROUND: There is a known recipient sex-dependent association between donor sex and kidney transplant survival. We hypothesized that donor age also modifies the association between donor sex and graft survival. METHODS: First deceased donor kidney transplant recipients (1988-2019, n = 461 364) recorded in the Scientific Registry of Transplant Recipients, the Australia and New Zealand Dialysis and Transplant Registry and the Collaborative Transplant Study were analyzed. We used multivariable Cox regression models to estimate the association between donor sex and death-censored graft loss, accounting for the modifying effects of recipient sex and donor age; donor age was categorized as 5-19, 20-34, 35-49, 50-59 and ≥60 years. Results from cohort-specific Cox models were combined using individual patient data meta-analysis. RESULTS: Among female recipients of donors aged <60 years, graft loss hazards did not differ by donor sex; recipients of female donors ≥60 years showed significantly lower graft loss hazards than recipients of male donors of the same age [combined adjusted hazard ratio (aHR) 0.90, 95% CI 0.86-0.94]. Among male recipients, female donors aged <50 years were associated with significantly higher graft loss hazards than same-aged male donors (5-19 years: aHR 1.11, 95% CI 1.02-1.21; 20-34 years: aHR 1.08, 95% CI 1.02-1.15; 35-49 years: aHR 1.07, 95% CI 1.04-1.10). There were no significant differences in graft loss by donor sex among male recipients of donors aged ≥50 years. CONCLUSION: Donor age modifies the association between donor sex and graft survival. Older female donors were associated with similar or lower hazards of graft failure than older male donors in both male and female recipients, suggesting a better functional reserve of older female donor kidneys.
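As a rough illustration of the modeling strategy (a Cox model for death-censored graft loss in which the donor-sex effect is allowed to vary by donor-age category), the following Python sketch uses the lifelines package on simulated data. The column names, the single age indicator, and the effect sizes are all hypothetical; the published analysis used multivariable adjustment, several age categories, and cohort-specific models combined by meta-analysis.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 2000
# Hypothetical analysis dataset: follow-up time (years), a death-censored
# graft-loss indicator, and donor/recipient covariates (names illustrative).
df = pd.DataFrame({
    "donor_female":     rng.integers(0, 2, n),
    "donor_age_60plus": rng.integers(0, 2, n),  # one of several donor-age categories
    "recipient_female": rng.integers(0, 2, n),
})
# Donor-sex effect is allowed to differ by donor age via an interaction term
df["donor_female_x_60plus"] = df["donor_female"] * df["donor_age_60plus"]
hazard = 0.08 * np.exp(0.1 * df["donor_female"] - 0.1 * df["donor_female_x_60plus"])
event_time = rng.exponential(1.0 / hazard)
censor_time = rng.uniform(0, 15, n)
df["time_years"] = np.minimum(event_time, censor_time)
df["graft_loss"] = (event_time <= censor_time).astype(int)

cph = CoxPHFitter()
cph.fit(df, duration_col="time_years", event_col="graft_loss")
# Hazard ratios (exp(coef)) with 95% CIs for each term
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```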


Subject(s)
Kidney Transplantation , Humans , Male , Female , Renal Dialysis , Tissue Donors , Kidney , Proportional Hazards Models , Registries , Graft Survival , Graft Rejection
4.
Kidney Int ; 103(6): 1131-1143, 2023 06.
Article in English | MEDLINE | ID: mdl-36805451

ABSTRACT

Worldwide and at all ages, males have a higher mortality risk than females. This mortality bias should be preserved in kidney transplant recipients unless there are sex differences in the effects of transplantation. Here we compared the excess risk of mortality (risk above the general population) in female versus male recipients of all ages recorded in three large transplant databases. This included first deceased donor kidney transplant recipients and accounted for the modifying effects of donor sex and recipient age. After harmonization of variables across cohorts, relative survival models were fitted in each cohort separately and results were combined using individual patient data meta-analysis among 466,892 individuals (1988-2019). When the donor was male, female recipients 0-12 years (Relative Excess Risk 1.54, 95% Confidence Interval 1.20-1.99), 13-24 years (1.17, 1.01-1.34), 25-44 years (1.11, 1.05-1.18) and 60 years and older (1.05, 1.02-1.08) showed higher excess mortality risks than male recipients of the same age. When the donor was female, the relative excess risks for recipients over 12 years of age were similar to those observed when the donor was male. Excess mortality risk was higher in female than in male recipients, with larger differences at younger ages, and was statistically significant only when the donor was male. While these findings may be partly explained by the known sex differences in graft loss risks, sex differences in the risks of death with graft function may also contribute. Thus, higher risks in females than males suggest that management needs to be modified to optimize transplant outcomes among females.


Subject(s)
Kidney Transplantation , Humans , Male , Female , Kidney Transplantation/adverse effects , Cohort Studies , Sex Characteristics , Graft Survival , Tissue Donors , Transplant Recipients
6.
Front Immunol ; 13: 829228, 2022.
Article in English | MEDLINE | ID: mdl-35401541

ABSTRACT

Natural killer (NK) cells may contribute to antibody-mediated rejection (ABMR) of renal allografts. The role of distinct NK cell subsets in this specific context, such as NK cells expressing the activating receptor NKG2C, is unknown. Our aim was to investigate whether KLRC2 gene deletion variants which determine NKG2C expression affect the pathogenicity of donor-specific antibodies (DSA) and, if so, influence long-term graft survival. We genotyped the KLRC2wt/del variants for two distinct kidney transplant cohorts, (i) a cross-sectional cohort of 86 recipients who, on the basis of a positive post-transplant DSA result, all underwent allograft biopsies, and (ii) 1,860 recipients of a deceased donor renal allograft randomly selected from the Collaborative Transplant Study (CTS) database. In the DSA+ patient cohort, KLRC2wt/wt (80%) was associated with ABMR (65% versus 29% among KLRC2wt/del subjects; P=0.012), microvascular inflammation [MVI; median g+ptc score: 2 (interquartile range: 0-4) versus 0 (0-1), P=0.002], a molecular classifier of ABMR [0.41 (0.14-0.72) versus 0.10 (0.07-0.27), P=0.001], and elevated NK cell-related transcripts (P=0.017). In combined analyses of KLRC2 variants and a functional polymorphism in the Fc gamma receptor IIIA gene (FCGR3A-V/F158), ABMR rates and activity gradually increased with the number of risk genotypes. In the DSA+ and CTS cohorts, however, the KLRC2wt/wt variant did not affect long-term death-censored graft survival, even when combined with the FCGR3A-V158 risk variant. KLRC2wt/wt may be associated with DSA-triggered MVI and ABMR-associated gene expression patterns, but the findings observed in a highly selected cohort of DSA+ patients did not translate into meaningful graft survival differences in a large multicenter kidney transplant cohort not selected for HLA sensitization.


Subject(s)
Kidney Transplantation , Cross-Sectional Studies , Graft Rejection , Humans , Isoantibodies , Kidney Transplantation/adverse effects , NK Cell Lectin-Like Receptor Subfamily C/genetics , NK Cell Lectin-Like Receptor Subfamily D , Receptors, Natural Killer Cell
7.
Exp Clin Transplant ; 20(1): 19-27, 2022 01.
Article in English | MEDLINE | ID: mdl-35060445

ABSTRACT

OBJECTIVES: Croatia is among the global leaders in deceased donation rates, yet we face an organ shortage and, concurrently, a sharp decline in our acceptance rates for kidney offers. To reevaluate our organ acceptance policy, we retrospectively analyzed the factors that influenced the posttransplant outcomes of kidneys from elderly deceased donors at our center during a 20-year period and the changes to our organ acceptance criteria during Eurotransplant membership. MATERIALS AND METHODS: We studied all kidney transplants from donors ≥60 years old during the two 5-year periods of Eurotransplant membership from 2007 to 2017 (period II and period III) and compared those data to data from the decade before Eurotransplant membership (period I, 1997-2007). Differences in acceptance rates and in reasons for declining kidney offers between the two 5-year periods of Eurotransplant membership were analyzed. RESULTS: In period I, 14.1% of all kidney allografts were obtained from donors ≥60 years old; in period II and period III the rates were nearly 2-fold higher (27.0% and 25.7%, respectively; P = .007 and P = .008). During the first 5-year period of Eurotransplant membership (period II), we accepted significantly more grafts from marginal donors with a higher number of human leukocyte antigen mismatches compared with period I. Consequently, the 3-month survival rate of kidneys from donors ≥60 years old dropped from 91.1% to as low as 74.2% (P = .034). After application of more stringent human leukocyte antigen matching, especially in human leukocyte antigen DR, and more stringent donor acceptance criteria in period III, graft survival improved to 91.1%. CONCLUSIONS: Our experience indicates that careful selection of kidneys from elderly deceased donors and allocation to human leukocyte antigen-matched recipients is important to improve transplant outcomes.


Subject(s)
Kidney Transplantation , Tissue and Organ Procurement , Aged , Croatia , Graft Survival , Humans , Kidney Transplantation/adverse effects , Middle Aged , Registries , Retrospective Studies , Tissue Donors , Treatment Outcome
8.
Transplantation ; 106(4): e212-e218, 2022 04 01.
Article in English | MEDLINE | ID: mdl-35066544

ABSTRACT

BACKGROUND: Patients aged ≥60 y represent the fastest growing population among kidney transplant recipients and waitlist patients. They show an elevated infection risk and are frequently transplanted with multiple human leukocyte antigen mismatches. Whether the choice of calcineurin inhibitor influences graft survival, mortality, or key secondary outcomes such as infections in this vulnerable recipient population is unknown. METHODS: A total of 31 177 kidney transplants from deceased donors performed between 2000 and 2019 at European centers and reported to the Collaborative Transplant Study were analyzed using multivariable Cox and logistic regression analyses. All recipients were ≥60 y old and received tacrolimus (Tac) or cyclosporine A on an intention-to-treat basis, combined with mycophenolic acid or azathioprine, with or without steroids. RESULTS: The risk of 3-y death-censored graft loss and patient mortality did not differ significantly between Tac- and cyclosporine A-treated patients (hazard ratio 0.98 and 0.95, P = 0.74 and 0.20, respectively). No difference was found in the overall risk of hospitalization for infection (hazard ratio = 0.95, P = 0.19); however, a lower incidence of rejection treatment (hazard ratio = 0.81, P < 0.001) was observed in Tac-treated patients. Assessment of pathogen-specific hospitalizations revealed no difference in the risk of hospitalization due to bacterial infection (odds ratio = 1.00, P = 0.96), but a significantly higher risk of hospitalization due to human polyomavirus infection was found among Tac-treated patients (odds ratio = 2.45, P = 0.002). The incidence of de novo diabetes was higher for Tac-based immunosuppression (odds ratio = 1.79, P < 0.001). CONCLUSIONS: Calcineurin inhibitor selection has no significant influence on death-censored graft survival, mortality, and overall infection risk in ≥60-y-old kidney transplant recipients.


Subject(s)
Calcineurin Inhibitors , Kidney Transplantation , Aged , Calcineurin Inhibitors/adverse effects , Cyclosporine/adverse effects , Graft Rejection/mortality , Graft Rejection/prevention & control , Graft Survival/immunology , Humans , Immunosuppressive Agents/adverse effects , Kidney Transplantation/adverse effects , Kidney Transplantation/mortality , Mycophenolic Acid/adverse effects , Tacrolimus/adverse effects , Transplant Recipients
9.
Transplantation ; 106(7): 1473-1484, 2022 07 01.
Article in English | MEDLINE | ID: mdl-34974454

ABSTRACT

BACKGROUND: Sex differences in kidney graft loss rates were reported in the United States. Whether these differences are present in other countries is unknown. METHODS: We estimated the association between recipient sex and death-censored graft loss in patients of all ages recorded in the Scientific Registry of Transplant Recipients, Australia and New Zealand Dialysis and Transplant Registry, and Collaborative Transplant Study registries who received a first deceased donor kidney transplant (1988-2019). We used multivariable Cox regression models, accounting for the modifying effects of donor sex and recipient age, in each registry separately; results were combined using individual patient data meta-analysis. RESULTS: We analyzed 438 585 patients. Young female patients 13-24 y old had the highest crude graft loss rates (female donor: 5.66; male donor: 5.50 per 100 person-years). Among young recipients of male donors, females showed higher graft loss risks than males (0-12 y: adjusted hazard ratio [aHR] 1.42, (95% confidence interval [CI], 1.17-1.73); 13-24 y: 1.24 (1.17-1.32); 25-44 y: 1.09 (1.06-1.13)). When the donor was female, there were no significant differences by recipient sex among those of age <45 y; however, the aHR for females was 0.93 (0.89-0.98) in 45-59 y-old and 0.89 (0.86-0.93) in ≥ 60 y-old recipients. Findings were similar for all 3 registries in most age intervals; statistically significant heterogeneity was seen only among 13-24-y-old recipients of a female donor (I2 = 71.5%, P = 0.03). CONCLUSIONS: There is an association between recipient sex and kidney transplantation survival that is modified by recipient age and donor sex.


Subject(s)
Kidney Transplantation , Female , Graft Rejection , Graft Survival , Humans , Kidney Transplantation/methods , Male , Registries , Sex Characteristics , Tissue Donors , Transplant Recipients , United States/epidemiology
10.
Pediatr Transplant ; 26(1): e14154, 2022 Feb.
Article in English | MEDLINE | ID: mdl-34612565

ABSTRACT

BACKGROUND: Approximately 1700 children per year with end-stage kidney disease undergo kidney transplantation in Europe and the United States of America; 30%-50% are living donor kidney transplantations. There may be immunological differences between paternal and maternal donors due to transplacental exchange of cells between the mother and fetus during pregnancy leading to microchimerism. We investigated whether the outcome of living-related kidney transplantation in young children is different after maternal compared with paternal organ donation. METHODS: Using the international Collaborative Transplant Study (CTS) database, we analyzed epidemiological data of 7247 children and adolescents aged <18 years who had received a kidney transplant from either mother or father. Risks of treated rejection episodes and death-censored graft failure were estimated using the Kaplan-Meier method and multivariable Cox regression. RESULTS: In the recipient age group 1-4 years, the rate of treated rejection episodes in recipients of kidneys from maternal donors (N = 195) during the first 2 years post-transplant was significantly lower (hazard ratio (HR) 0.47, p = .004) than in patients receiving kidneys from paternal donors (N = 179). This association between donor sex and risk of treated rejections was not observed in children aged 5-9 years. The 5-year death-censored graft survival in children aged 1-4 years with a maternal or paternal donor was comparable. CONCLUSIONS: Maternal kidney donation in young pediatric renal transplant recipients is associated with an approximately 50% lower rate of treated rejection than paternal kidney donation. Whether this phenomenon is due to maternal microchimerism-induced donor-specific hyporesponsiveness must be evaluated in prospective mechanistic studies.
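A minimal sketch of the survival comparison described above (Kaplan-Meier estimates for time to first treated rejection by maternal versus paternal donor, with a log-rank comparison) is shown below using lifelines on simulated data. The variable names, follow-up window, and effect size are illustrative only; the published analysis additionally used multivariable Cox regression.

```python
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Hypothetical data: time to first treated rejection (years, censored at 2 y)
# for young recipients, split by maternal versus paternal donor.
rng = np.random.default_rng(1)
n = 400
maternal = rng.integers(0, 2, n)
event_time = rng.exponential(scale=np.where(maternal == 1, 6.0, 3.0))
time = np.minimum(event_time, 2.0)            # administrative censoring at 2 years
rejection = (event_time <= 2.0).astype(int)

kmf = KaplanMeierFitter()
for label, mask in [("maternal donor", maternal == 1),
                    ("paternal donor", maternal == 0)]:
    kmf.fit(time[mask], rejection[mask], label=label)
    print(label, "rejection-free at 2 y:",
          round(float(kmf.survival_function_.iloc[-1, 0]), 3))

res = logrank_test(time[maternal == 1], time[maternal == 0],
                   event_observed_A=rejection[maternal == 1],
                   event_observed_B=rejection[maternal == 0])
print("log-rank p =", round(res.p_value, 4))
```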


Subject(s)
Graft Rejection/immunology , Graft Survival/immunology , Kidney Failure, Chronic/surgery , Kidney Transplantation/methods , Living Donors , Parents , Adolescent , Age Factors , Child , Child, Preschool , Female , Graft Rejection/epidemiology , Graft Rejection/prevention & control , Humans , Infant , Kaplan-Meier Estimate , Male , Proportional Hazards Models , Registries , Retrospective Studies , Risk Factors
11.
Liver Transpl ; 28(5): 807-818, 2022 05.
Article in English | MEDLINE | ID: mdl-34806843

ABSTRACT

Split-liver transplantation offers a solution to the organ shortage problem. However, the outcomes of extended right lobe liver transplantation (ERLT) and whether it is a suitable alternative to full-size liver transplantation (FSLT) remain controversial. We compared the outcomes of ERLT and FSLT in adult recipients of 43,409 first deceased donor liver transplantations using Cox regression. We also analyzed 612 ERLT and 1224 FSLT 1:2 matched cases to identify factors that affect ERLT outcome. The risk of graft loss was significantly higher following ERLT than following FSLT during the first posttransplantation year in both the matched and unmatched cohorts (hazard ratio [HR], 1.39 and 1.27 and P = 0.01 and 0.006, respectively). Every additional hour of cold ischemia time (CIT) increased the risk of 1-year graft loss by 10% in the ERLT group compared with only 3% in the FSLT group (P = 0.003 and <0.001, respectively). Importantly, the outcome of ERLT and FSLT did not differ significantly if the CIT was below 10 hours (HR, 0.71; P = 0.22). One-year graft and patient survival were lower in high-risk ERLT recipients with a Model for End-Stage Liver Disease (MELD) score of ≥20 (HR, 1.88; P = 0.03 and HR, 2.03; P = 0.02). In the male recipient-male donor combination, ERLT recipients had a higher risk of 1-year graft loss than FSLT recipients (HR, 2.44; P = 0.006). This was probably because of the significantly higher MELD score in ERLT recipients (P = 0.004). ERLT in adults is an adequate alternative to FSLT and offers an elegant solution to the problem of organ shortage as long as the cold storage time is less than 10 hours and the recipient's MELD score is <20.


Subject(s)
End Stage Liver Disease , Liver Transplantation , Adult , End Stage Liver Disease/surgery , Graft Survival , Humans , Liver Transplantation/adverse effects , Living Donors , Male , Retrospective Studies , Severity of Illness Index , Treatment Outcome
12.
Front Surg ; 8: 678392, 2021.
Article in English | MEDLINE | ID: mdl-34926560

ABSTRACT

Introduction: Hepatocellular carcinoma (HCC) is by far the leading malignant indication for liver transplantation (LT). A few other malignancies, including cholangiocellular carcinoma (CCC), metastases from neuroendocrine tumors (NET), and sarcomas of the liver (LSAR), are also commonly accepted indications for LT. However, there is limited information on their outcome after LT. Methods: Graft and patient survival in 14,623 LTs performed in patients with HCC, CCC, NET, and LSAR from 1988 to 2017 and reported to the Collaborative Transplant Study were analyzed. Results: The study group consisted of 13,862 patients who had HCC (94.8%), 498 (3.4%) who had CCC, 100 (0.7%) who had NET, and 163 (1.1%) who had LSAR. CCC patients showed a 5-year graft survival rate of 32.1%, strikingly lower than the 63.2% rate in HCC, 51.6% rate in NET, and 64.5% rate in LSAR patients (P < 0.001 for all vs. CCC). Multivariable Cox regression analysis revealed a significantly higher risk of graft loss and death due to cancer during the first five post-transplant years in CCC vs. HCC patients (HR 1.77 and 2.56; P < 0.001 for both). These risks were also increased in NET and LSAR patients but did not reach statistical significance. Conclusion: Among patients with rare malignant indications for LT, CCC patients showed significantly impaired graft as well as patient survival compared to HCC patients. The observed differences might challenge traditional decision-making processes for LT indication and palliative treatment in specific hepatic malignancies.

13.
Front Immunol ; 12: 724331, 2021.
Article in English | MEDLINE | ID: mdl-34497614

ABSTRACT

The functional Fc gamma receptor (FcγR) IIIA polymorphism FCGR3A-V/F158 has previously been suggested to determine the potential of donor-specific HLA antibodies to trigger microcirculation inflammation, a key lesion of antibody-mediated renal allograft rejection. Associations with long-term transplant outcomes, however, have not been evaluated to date. To clarify the impact of the FCGR3A-V/F158 polymorphism on kidney transplant survival, we genotyped a cohort of 1,940 recipient/donor pairs. Analyzing 10-year death-censored allograft survival, we found no significant differences in relation to FCGR3A-V/F158. There was also no independent survival effect in a multivariable Cox model. Similarly, functional polymorphisms in two other activating FcγR, FCGR2A-H/R131 (FcγRIIA) and FCGR3B-NA1/NA2 (FcγRIIIB), were not associated with outcome. There were also no significant survival differences among patient subgroups at increased risk of rejection-related injury, such as pre-sensitized recipients (> 0% panel reactivity; n = 438) or recipients treated for rejection within the first year after transplantation (n = 229). Our study results suggest that the previously reported association of FcγR polymorphism with microcirculation inflammation may not be strong enough to exert a meaningful effect on graft survival.


Subject(s)
Genotype , Graft Rejection/genetics , Receptors, IgG/genetics , Adult , Allografts , Female , Graft Rejection/immunology , Humans , Isoantibodies/metabolism , Kidney Transplantation , Male , Middle Aged , Polymorphism, Single Nucleotide , Retrospective Studies , Survival Analysis
14.
Transplantation ; 105(11): 2461-2469, 2021 11 01.
Article in English | MEDLINE | ID: mdl-33988347

ABSTRACT

BACKGROUND: The use of kidney allografts from ≥70-y-old donors has increased persistently over the last 20 y. Prolonged cold ischemia time (CIT) is well known to increase graft failure risk. However, despite their growing importance, no data are available on the impact of CIT specifically on survival of allografts from ≥70-y-old donors. METHODS: In total, 47 585 kidney transplantations from expanded criteria donors (ECDs) performed during 2000-2017 and reported to the Collaborative Transplant Study were analyzed. The impact of CIT on 5-y death-censored graft and patient survival was studied for transplantations from <70-y (n = 33 305) and ≥70-y-old ECDs (n = 14 280). RESULTS: Compared with the reference of ≤12 h CIT, a CIT of 13-18 h did not increase the risk of graft failure significantly, either for recipients of kidneys from <70-y or from ≥70-y-old ECDs. In contrast, graft failure risk increased significantly when CIT exceeded 18 h, both in recipients of kidneys from <70-y and, more pronounced, from ≥70-y-old ECDs (CIT 19-24 h: hazard ratio [HR] = 1.19 and 1.24; P < 0.001; CIT ≥24 h: HR = 1.28 and 1.32, P < 0.001 and P = 0.003, respectively). Within the 18-h CIT interval, additional HLA matching further improved survival of ECD transplants significantly, whereas the negative impact of a prolonged CIT >18 h was stronger in ≥65-y-old recipients and for transplants with multiple HLA mismatches. The influence of CIT on patient survival was less pronounced. CONCLUSIONS: CIT, as long as it is kept ≤18 h, has no significant impact on survival of kidney transplants, even from ≥70-y-old ECDs.


Subject(s)
Kidney Failure, Chronic , Kidney Transplantation , Aged , Cold Ischemia/adverse effects , Graft Survival , Humans , Kidney Transplantation/adverse effects , Tissue Donors
15.
Front Immunol ; 12: 631246, 2021.
Article in English | MEDLINE | ID: mdl-33717167

ABSTRACT

In a cohort of 68,606 first deceased donor kidney transplantations reported to the Collaborative Transplant Study, we analyzed whether epitope-based matching of donor-recipient pairs using the Predicted Indirectly ReCognizable HLA Epitopes algorithm (PIRCHE-II) is superior to currently applied HLA antigen matching. PIRCHE-II scores were calculated based on split antigen HLA-A, -B, -DRB1 typing and adjusted to the 0-6 range of HLA mismatches. PIRCHE-II scores correlated strongly with the number of HLA mismatches (Spearman ρ = 0.65, P < 0.001). In multivariable analyses both parameters were found to be significant predictors of 5-year death-censored graft loss with high prognostic power [hazard ratio (HR) per adjusted PIRCHE-II score = 1.102, per HLA mismatch = 1.095; z-value PIRCHE-II: 9.8, HLA: 11.2; P < 0.001 for both]. When PIRCHE-II scores and HLA mismatches were analyzed simultaneously, their predictive power decreased but remained significant (PIRCHE-II: P = 0.002; HLA: P < 0.001). The influence of PIRCHE-II was especially strong in presensitized recipients, whereas the influence of HLA mismatches was especially strong in non-sensitized recipients. If the level of HLA incompatibility was low (0-3 mismatches), PIRCHE-II scores showed a low impact on graft survival (HR = 1.031) and PIRCHE-II matching did not have additional significant benefit (P = 0.10). However, if the level of HLA incompatibility was high (4-6 mismatches), PIRCHE-II improved the positive impact of matching compared to applying the traditional HLA matching alone (HR = 1.097, P = 0.005). Our results suggest that the PIRCHE-II score is useful and can be included into kidney allocation algorithms in addition to HLA matching; however, at the resolution level of HLA typing that is currently used for allocation it cannot fully replace traditional HLA matching.
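The score handling described above (a raw PIRCHE-II score rescaled so that it is comparable to the 0-6 HLA mismatch scale, then correlated with mismatch counts) could look roughly like the following Python sketch. The exact adjustment used in the study is not specified in the abstract; min-max rescaling is shown here as one plausible reading, and all data are simulated.

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical raw PIRCHE-II scores and HLA-A+B+DR mismatch counts (0-6)
rng = np.random.default_rng(2)
hla_mm = rng.integers(0, 7, 500)
pirche_raw = 20 * hla_mm + rng.normal(0, 30, 500).clip(min=0)

# One plausible reading of "adjusted to the 0-6 range of HLA mismatches":
# min-max rescaling of the raw score onto a 0-6 scale (the study's exact
# transformation is not given in the abstract).
pirche_adj = 6 * (pirche_raw - pirche_raw.min()) / (pirche_raw.max() - pirche_raw.min())

rho, p = spearmanr(pirche_adj, hla_mm)
print(f"Spearman rho = {rho:.2f}, p = {p:.3g}")
```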


Subject(s)
Epitopes/immunology , Graft Rejection/immunology , HLA Antigens/immunology , Kidney Transplantation/methods , Adult , Aged , Cohort Studies , Female , Graft Rejection/prevention & control , Graft Survival , HLA Antigens/genetics , Humans , Kidney Transplantation/standards , Male , Middle Aged , Tissue Donors/statistics & numerical data
16.
Transpl Int ; 35: 10071, 2021.
Article in English | MEDLINE | ID: mdl-35185364

ABSTRACT

Main problem: Soluble urokinase plasminogen activator receptor (suPAR) is an immunological risk factor for kidney disease and a prognostic marker for cardiovascular events. Methods: We measured serum suPAR levels in a total of 1,023 kidney transplant recipients either before (cohort 1, n = 474) or at year 1 after transplantation (cohort 2, n = 549). The association of suPAR levels with all-cause and cardiovascular mortality was evaluated by multivariable Cox regression analysis. Results: Patients in the highest suPAR tertile had a significantly higher risk of all-cause mortality than those in the two lower tertiles in both cohorts separately (cohort 1: hazard ratio (HR) 1.92, 95% confidence interval (CI) 1.20-3.08, p = 0.007; cohort 2: HR = 2.78, 95% CI 1.51-5.13, p = 0.001) and combined (n = 1,023, combined HR = 2.14, 95% CI 1.48-3.08, p < 0.001). The association remained significant in the subgroup of patients with normal kidney function (cohort 2: HR = 5.40, 95% CI 1.42-20.5, p = 0.013). The increased mortality risk in patients with high suPAR levels was attributable mainly to an increased rate of cardiovascular death (n = 1,023, HR = 4.24, 95% CI 1.81-9.96, p < 0.001). Conclusion: A high suPAR level prior to and at 1 year after kidney transplantation was associated with an increased risk of patient death independent of kidney function, predominantly from cardiovascular causes.
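A minimal sketch of the tertile-based analysis (assigning suPAR tertiles and comparing the highest tertile against the two lower tertiles in a Cox model for all-cause mortality) is shown below using pandas and lifelines on simulated data. It is univariable for brevity, whereas the published analysis was multivariable; the variable names and effect sizes are hypothetical.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 1000
df = pd.DataFrame({"supar": rng.lognormal(mean=1.0, sigma=0.4, size=n)})

# Assign suPAR tertiles and flag the highest tertile vs the two lower tertiles
df["tertile"] = pd.qcut(df["supar"], q=3, labels=[1, 2, 3]).astype(int)
df["top_tertile"] = (df["tertile"] == 3).astype(int)

# Illustrative survival outcome: higher hazard of death in the top tertile
event_time = rng.exponential(scale=np.where(df["top_tertile"] == 1, 8.0, 16.0))
censor = rng.uniform(0, 10, n)
df["time"] = np.minimum(event_time, censor)
df["death"] = (event_time <= censor).astype(int)

cph = CoxPHFitter()
cph.fit(df[["top_tertile", "time", "death"]], duration_col="time", event_col="death")
print(cph.summary.loc["top_tertile", ["exp(coef)", "p"]])
```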


Subject(s)
Kidney Transplantation , Receptors, Urokinase Plasminogen Activator , Biomarkers , Humans , Kidney Transplantation/adverse effects , Prognosis , Urokinase-Type Plasminogen Activator
17.
Int J Immunogenet ; 48(2): 201-210, 2021 Apr.
Article in English | MEDLINE | ID: mdl-32945128

ABSTRACT

Due to a widespread organ shortage, the use of expanded criteria donors (ECDs) in kidney transplantation has increased persistently, reaching approximately 40% in recent years. Whether human leucocyte antigen (HLA) matching between donor and recipient should be part of allocation algorithms in transplantation of ECD kidneys, and especially of ECD kidneys from ≥70-year-old donors, is still in question. To this end, 135,529 kidney transplantations performed between 2000 and 2017 and reported to the Collaborative Transplant Study were analysed and the impact of HLA-A+B+DR mismatches on death-censored graft and patient survival as well as on rejection episodes was investigated. Results were stratified according to donor status (standard criteria donor (SCD) versus ECD) and age of ECD. HLA incompatibility increased the five-year death-censored graft failure risk to a similar extent in recipients of ECD and SCD transplants (hazard ratio (HR) per HLA mismatch 1.078 and 1.075, respectively; p < .001 for both). Its impact on rejection treatments during the first post-transplant year was also significant but slightly weaker for recipients of ECD transplants (risk ratio (RR) per HLA mismatch 1.10 for ECD transplants and 1.13 for SCD transplants; p < .001 for both). Mortality increased gradually from zero to six HLA mismatches in recipients of SCD transplants, whereas for ECD transplants a significant increase was notable only between zero and one or more mismatches. A significant but slightly less pronounced impact of HLA incompatibility on graft failure was observed in transplants from ≥70- compared with <70-year-old ECDs (HR per mismatch 1.047 and 1.093; p = .009 and p < .001, respectively). The influence of HLA mismatches on rejection treatments was the same for both ECD age groups (RR = 1.10, p < .001 and p = .004, respectively). Our data indicate that HLA matching should be part of allocation algorithms not only in transplantation of kidneys from SCDs but also from ECDs.


Subject(s)
Donor Selection/standards , HLA Antigens/immunology , Histocompatibility , Kidney Transplantation , Tissue Donors , Adult , Age Factors , Aged , Cadaver , Cause of Death , Cold Ischemia , Confounding Factors, Epidemiologic , Graft Rejection/immunology , Graft Rejection/prevention & control , Graft Rejection/therapy , Graft Survival/immunology , Humans , Hypertension/epidemiology , Immunosuppressive Agents/therapeutic use , Isoantibodies/biosynthesis , Isoantibodies/immunology , Kidney Failure, Chronic/mortality , Kidney Failure, Chronic/surgery , Kidney Transplantation/mortality , Middle Aged , Proportional Hazards Models , Survival Analysis , Tissue Donors/supply & distribution
18.
Front Immunol ; 11: 1886, 2020.
Article in English | MEDLINE | ID: mdl-32983110

ABSTRACT

Delayed graft function (DGF) occurs in a significant proportion of deceased donor kidney transplant recipients and is associated with graft injury and inferior clinical outcome. The aim of the present multi-center study was to identify the immunological and non-immunological predictors of DGF and to determine its influence on outcome in the presence and absence of human leukocyte antigen (HLA) antibodies. 1,724 patients who received a deceased donor kidney transplant during 2008-2017 and for whom a pre-transplant serum sample was available were studied. Graft survival during the first 3 post-transplant years was analyzed by multivariable Cox regression. Pre-transplant predictors of DGF and the influence of DGF and pre-transplant HLA antibodies on biopsy-proven rejections in the first 3 post-transplant months were determined by multivariable logistic regression. Donor age ≥50 years, simultaneous pre-transplant presence of HLA class I and II antibodies, diabetes mellitus as cause of end-stage renal disease, cold ischemia time ≥18 h, and time on dialysis >5 years were associated with increased risk of DGF, while the risk was reduced if the donor or recipient was female or if the donor's cause of death was trauma. DGF alone doubled the risk of graft loss, driven more by impaired death-censored graft survival than by patient survival. In DGF patients, the risk of death-censored graft loss increased further if HLA antibodies (hazard ratio HR=4.75, P < 0.001) or donor-specific HLA antibodies (DSA, HR=7.39, P < 0.001) were present pre-transplant. In the presence of HLA antibodies or DSA, the incidence of biopsy-proven rejections, including antibody-mediated rejections, increased significantly in patients with as well as without DGF. Recipients without DGF and without biopsy-proven rejections during the first 3 months had the highest fraction of patients with good kidney function at year 1, whereas patients with both DGF and rejection showed the lowest rate of good kidney function, especially when organs from ≥65-year-old donors were used. In this new era of transplantation, besides non-immunological factors, the pre-transplant presence of HLA class I and II antibodies also increases the risk of DGF. Measures to prevent the strong negative impact of DGF on outcome are necessary, especially during organ allocation for presensitized patients.
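The predictor analysis described above (a multivariable logistic regression for the odds of DGF given pre-transplant characteristics) might be set up roughly as in the following Python sketch using statsmodels. The predictor names, coefficients, and data are hypothetical; the study's actual covariate set and coding are not fully specified in the abstract.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 1500
# Hypothetical pre-transplant predictors of delayed graft function (DGF)
X = pd.DataFrame({
    "donor_age_50plus":     rng.integers(0, 2, n),
    "hla_ab_class_1_and_2": rng.integers(0, 2, n),  # HLA class I and II antibodies
    "cit_18h_plus":         rng.integers(0, 2, n),  # cold ischemia time >= 18 h
    "dialysis_over_5y":     rng.integers(0, 2, n),
    "recipient_female":     rng.integers(0, 2, n),
})
# Simulated DGF outcome with arbitrary illustrative effects
logit_p = -1.5 + 0.5 * X["donor_age_50plus"] + 0.4 * X["cit_18h_plus"] - 0.3 * X["recipient_female"]
y = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

model = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
odds_ratios = np.exp(model.params)                  # odds ratios per predictor
print(pd.DataFrame({"OR": odds_ratios, "p": model.pvalues}).round(3))
```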


Subject(s)
Delayed Graft Function/immunology , Graft Rejection/immunology , HLA Antigens/immunology , Isoantibodies/blood , Kidney Transplantation/adverse effects , Adult , Aged , Biomarkers/blood , Delayed Graft Function/blood , Delayed Graft Function/diagnosis , Delayed Graft Function/mortality , Europe , Female , Graft Rejection/blood , Graft Rejection/diagnosis , Graft Rejection/mortality , Graft Survival , Humans , Kidney Transplantation/mortality , Male , Middle Aged , Prospective Studies , Risk Assessment , Risk Factors , Time Factors , Treatment Outcome
19.
Transpl Int ; 33(12): 1681-1692, 2020 12.
Article in English | MEDLINE | ID: mdl-32881096

ABSTRACT

Adolescent and young adult age is a high-risk window with an alarmingly increased likelihood of premature kidney graft loss due to immunological rejection. Using the large database of the Collaborative Transplant Study, we analyzed whether a more intense and less variable exposure to tacrolimus could counteract this young age-related enhanced immunoreactivity. Kidney graft recipients aged 12-23 years (n = 964) with a 1-year tacrolimus trough level between 4.0 and 10.9 ng/ml had a 5-year graft survival rate of 85.1%, significantly better than the 66.1% rate in patients with a trough level below 4.0 ng/ml, who showed a 2.38-fold increased risk of graft loss in the multivariable analysis (P < 0.001). This association was not apparent in young children aged 0-11 years (n = 455) and less pronounced in adults aged 24-34 years (n = 1466). However, an intra-patient variability (IPV) of tacrolimus trough levels ≥1.5 at post-transplant years 1 and 2 was associated with an increased graft loss risk in both 12- to 23-year-old and 0- to 11-year-old recipients (P < 0.001 and P = 0.045). Patients with high IPV made up as many as 30% of kidney graft recipients, indicating that a more intense and less variable exposure to tacrolimus could substantially improve graft survival in this high-risk group.
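The abstract does not state how intra-patient variability (IPV) of tacrolimus trough levels was computed, only that a cut-off of ≥1.5 was applied. The Python sketch below shows one common operationalization (the per-patient standard deviation of trough levels in ng/ml, with the coefficient of variation as an alternative); this is an assumption for illustration, not the study's definition, and the data are invented.

```python
import pandas as pd

def ipv_per_patient(troughs: pd.DataFrame, metric: str = "sd") -> pd.Series:
    """Intra-patient variability (IPV) of tacrolimus trough levels.
    The abstract does not define IPV; the per-patient standard deviation of
    trough levels (ng/ml) is used here as one common choice, with the
    coefficient of variation as an alternative."""
    grouped = troughs.groupby("patient_id")["trough_ng_ml"]
    if metric == "sd":
        return grouped.std(ddof=1)
    return grouped.std(ddof=1) / grouped.mean()   # coefficient of variation

# Hypothetical trough measurements during post-transplant years 1-2
troughs = pd.DataFrame({
    "patient_id":   [1, 1, 1, 1, 2, 2, 2, 2],
    "trough_ng_ml": [6.1, 7.0, 6.4, 6.8, 4.2, 9.5, 5.1, 8.0],
})
ipv = ipv_per_patient(troughs)
print(ipv.round(2))
print("high IPV (>= 1.5):", list(ipv[ipv >= 1.5].index))
```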


Subject(s)
Kidney Transplantation , Tacrolimus , Adolescent , Adult , Child , Child, Preschool , Graft Rejection , Graft Survival , Humans , Immunosuppressive Agents , Infant , Infant, Newborn , Kidney , Registries , Young Adult
20.
Front Immunol ; 11: 892, 2020.
Article in English | MEDLINE | ID: mdl-32477362

ABSTRACT

Introduction: Despite increasing awareness of the negative impact of cold ischemia time (CIT) in liver transplantation, its precise influence in different subgroups of liver transplant recipients has not been analyzed in detail. This study aimed to identify liver transplant recipients with an unfavorable outcome due to prolonged cold ischemia. Methods: 40,288 adult liver transplantations performed between 1998 and 2017 and reported to the Collaborative Transplant Study were analyzed. Results: Prolonged CIT significantly reduced graft and patient survival only during the first post-transplant year. On average, each hour added to the cold ischemia was associated with a 3.4% increase in the risk of graft loss (hazard ratio (HR) 1.034, P < 0.001). The impact of CIT was strongest in patients with hepatitis C-related (HCV) cirrhosis, with a 24% higher risk of graft loss even at 8-9 h (HR 1.24, 95% CI 1.05-1.47, P = 0.011) and a 64% higher risk at ≥14 h (HR 1.64, 95% CI 1.30-2.09, P < 0.001). In contrast, patients with hepatocellular cancer (HCC) and alcoholic cirrhosis tolerated longer ischemia times up to <10 and <12 h, respectively, without significant impact on graft survival (P = 0.47 and 0.42). In HCC patients with Model for End-Stage Liver Disease (MELD) scores <20, graft survival was not significantly impaired in the cases of CIT up to 13 h. Conclusion: The negative influence of CIT on liver transplant outcome depends on the underlying disease, patients with HCV-related cirrhosis being at the highest risk of graft loss due to prolonged cold ischemia. Grafts with longer cold preservation times should preferentially be allocated to recipients with alcoholic cirrhosis and HCC patients with MELD <20, in whom the effect of cold ischemia is less pronounced.
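Because the per-hour effect is reported as a hazard ratio, the impact of additional hours of cold ischemia compounds multiplicatively on the hazard scale. The short Python sketch below simply illustrates that arithmetic under the assumption of log-linearity in CIT; it is not a re-analysis of the study data.

```python
# Compounding a per-hour hazard ratio over additional hours of cold ischemia.
# With HR = 1.034 per hour (as reported for 1-year graft loss), k extra hours
# multiply the hazard by 1.034**k relative to the shorter-CIT reference.
per_hour_hr = 1.034
for extra_hours in (2, 6, 10):
    print(f"+{extra_hours} h: hazard x {per_hour_hr ** extra_hours:.2f}")
# Output: +2 h: x 1.07, +6 h: x 1.22, +10 h: x 1.40 (assumes log-linearity)
```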


Subject(s)
Cold Ischemia/adverse effects , Graft Survival , Liver Transplantation/adverse effects , Transplant Recipients/classification , Adolescent , Adult , Aged , Female , Graft Rejection , Humans , Intersectoral Collaboration , Male , Middle Aged , Proportional Hazards Models , Risk Factors , Survival Rate , Time Factors , Young Adult